In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate ''H''(''X'') is the limit of the joint entropy of ''n'' members of the process ''X''<sub>''k''</sub> divided by ''n'', as ''n'' tends to infinity:

: <math>H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)</math>

when the limit exists. An alternative, related quantity is:

: <math>H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).</math>

For strongly stationary stochastic processes, <math>H(X) = H'(X)</math>. The entropy rate can be thought of as a general property of stochastic sources; this is the asymptotic equipartition property.

== Entropy rates for Markov chains ==

Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a stationary distribution, the entropy rate is independent of the initial distribution. For example, for such a Markov chain ''Y''<sub>''k''</sub> defined on a countable number of states, given the transition matrix ''P''<sub>''ij''</sub>, ''H''(''Y'') is given by:

: <math>H(Y) = -\sum_{i,j} \mu_i P_{ij} \log P_{ij},</math>

where ''μ''<sub>''i''</sub> is the stationary distribution of the chain. A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate that is the same as the entropy of any individual member of the process.
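The Markov-chain formula above translates directly into a short computation. The following is a minimal Python sketch for a finite state space, assuming NumPy; the function names <code>stationary_distribution</code> and <code>entropy_rate</code> are illustrative, not from any particular library. It finds ''μ'' as the left eigenvector of the transition matrix for eigenvalue 1, evaluates <math>-\sum_{i,j} \mu_i P_{ij} \log_2 P_{ij}</math>, and finishes with a check of the i.i.d. special case mentioned above.

<syntaxhighlight lang="python">
import numpy as np

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1.

    Assumes P is the transition matrix of an irreducible, aperiodic,
    positive-recurrent (here: finite) Markov chain, so this vector
    exists and is unique.
    """
    eigvals, eigvecs = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(eigvals - 1.0))   # eigenvalue closest to 1
    mu = np.real(eigvecs[:, idx])
    return mu / mu.sum()

def entropy_rate(P):
    """Entropy rate H(Y) = -sum_{i,j} mu_i P_ij log2 P_ij, in bits per step."""
    mu = stationary_distribution(P)
    # Treat 0 * log 0 as 0 by masking zero transition probabilities.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P), 0.0)
    return float(-np.sum(mu[:, None] * terms))

# Two-state chain: stay with probability 0.9, switch with probability 0.1.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(entropy_rate(P))        # about 0.469 bits per step

# i.i.d. special case: every row identical, so each step is an independent
# draw from the same distribution q; the entropy rate equals H(q).
q = np.array([0.5, 0.25, 0.25])
P_iid = np.tile(q, (3, 1))
print(entropy_rate(P_iid))    # 1.5 bits = entropy of a single member
</syntaxhighlight>

Using base-2 logarithms gives the rate in bits per time step; substituting the natural logarithm gives nats, matching whichever convention is used for the entropy of a single variable.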